Web Survey Bibliography
Abstract: Cloud computing represents a new way to deploy computing technology, in which dynamically scalable and virtualized resources are provided as a service over the Internet. Amazon Elastic Compute Cloud (EC2) is an example of Infrastructure-as-a-Service that anyone can use today to access virtually unlimited computing capacity on demand. This new environment enables collaboration and resource sharing, and provides the tools for traceable and reproducible computational research. This model of allocating processing power holds the promise of a revolution in scientific and statistical computing.
Bringing this new era to research and education still requires new software that bridges the gap between the scientist’s everyday tools and the cloud. For instance, making R available as a service in the cloud, free of the memory and computing constraints of a local machine, would benefit the broad population of statisticians and research professionals. This is what Elastic-R (www.elasticr.net) delivers. It provides a Google Docs-like portal and workbench for data analysis that makes using R on the cloud even simpler than using it locally. It enables scientists, educators and students to allocate cloud resources seamlessly, work with R engines, and use their full capabilities from within any standard web browser.
Features include real-time collaboration; sharing and re-using virtual machines, sessions, data, functions, spreadsheets, and dashboards; and automatically generated macro-enabled Word documents and Excel workbooks that can be synchronized in real time with R engines on the cloud. Computationally intensive algorithms can easily be run on any number of virtual machines that are controlled from within a standard R session. Elastic-R is also an application platform that allows anyone to assemble statistical methods and data into interactive user interfaces for the end user. These interfaces and dashboards are created visually, and are automatically published and delivered as simple web applications.
In financial environments, this allows analysts to share common data sources and dashboards and to mirror them in a familiar office environment. In an industrial environment, it allows data and analyses to be shared among different production and research sites that may not have the same computing environment. Finally, since the proposed computing architecture uses the cloud as a workhorse, large-scale and resource-demanding calculations can be carried out on an on-demand basis without the need to install high-performance computing systems locally.
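The pattern of controlling several machines from within a standard R session can be sketched with base R's `parallel` package. This is a generic illustration, not Elastic-R's own API (which is not documented in the abstract); the cluster here runs two local worker processes, and the hostnames mentioned in the comment are placeholders.

```r
## A minimal sketch, using base R's `parallel` package, of driving
## computation on several workers from one R session. Locally this
## starts two worker processes; in a cloud setting the same pattern
## accepts a vector of VM hostnames, e.g.
## makePSOCKcluster(c("vm1.example.net", "vm2.example.net")).
library(parallel)

cl <- makeCluster(2)

## Distribute an embarrassingly parallel task: bootstrap replicates
## of a sample median, split across the workers.
boot_medians <- parSapply(cl, 1:1000, function(i) {
  median(sample(rnorm(500), replace = TRUE))
})

stopCluster(cl)
summary(boot_medians)
```

Because each replicate is independent, the same code scales from two local processes to any number of cloud virtual machines by changing only the cluster definition.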
Web survey bibliography (364)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Comparing acquiescent and extreme response styles in face-to-face and web surveys; 2017; Liu, M.; Conrad, F. G.; Lee, S.
- Respondent mode choice in a smartphone survey; 2017; Conrad, F. G.; Schober, M. F.; Antoun, C.; Yan, H. Y.; Hupp, A.; Johnston, M.; Ehlen, P.; Vickers, L...
- Effects of Mobile versus PC Web on Survey Response Quality: a Crossover Experiment in a Probability...; 2017; Antoun, C.; Couper, M. P.; Conrad, F. G.
- Methods for Evaluating Respondent Attrition in Web-Based Surveys; 2016; Hochheimer, C. J.; Sabo, R. T.; Krist, A. H.; Day, T.; Cyrus, J.; Woolf, S. H.
- Mobile-only web survey respondents; 2016; Lugtig, P. J.; Toepoel, V.; Amin, A.
- Using official surveys to reduce bias of estimates from nonrandom samples collected by web surveys; 2016; Beresovsky, V.; Dorfman, A.; Rumcheva, P.
- Making use of Internet interactivity to propose a dynamic presentation of web questionnaires; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- Helping respondents provide good answers in Web surveys; 2016; Couper, M. P.; Zhang, C.
- Gamifying. Not all fun and games; 2016; Stubington, P.; Crichton, C.
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T.; Wilson, S.
- Are sliders too slick for surveys?; 2016; Buskirk, T. D.
- Research gamification for quality pharmaceutical stakeholder insights; 2016; Mondry, B.; Fink, L.
- SurveyTester from Knowledge Navigators; 2016; Macer, T.
- Simplifying your mobile solution; 2016; Berry, K.
- Effects of motivating question types with graphical support in multi channel design studies; 2016; Luetters, H.; Friedrich-Freksa, M.; Vitt, S.; Goldstein, D. G.
- Why Do Web Surveys Take Longer on Smartphones?; 2016; Couper, M. P.; Peterson, G. J.
- Usability Testing within Agile Process; 2016; Holland, T.
- Association of Eye Tracking with Other Usability Metrics; 2016; Olmsted, E. L.
- Cognitive Probing Methods in Usability Testing – Pros and Cons; 2016; Nichols, E. M.
- Thinking Inside the Box: Visual Design of the Response Box Affects Creative Divergent Thinking in an...; 2016; Mohr, A. H.; Sell, A.; Lindsay, T.
- Distractions: The Incidence and Consequences of Interruptions for Survey Respondents; 2016; Ansolabehere, S.; Schaffner, B. F.
- The Effect of CATI Questions, Respondents, and Interviewers on Response Time; 2016; Olson, K.; Smyth, J. D.
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Effects of Data Collection Mode and Response Entry Device on Survey Response Quality; 2016; Ha, L.; Zhang, Che.; Jiang, W.
- Navigation Buttons in Web-Based Surveys: Respondents’ Preferences Revisited in the Laboratory; 2016; Romano Bergstrom, J. C.; Erdman, C.; Lakhe, S.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- A Framework of Incorporating Thai Social Networking Data in Online Marketing Survey; 2016; Jiamthapthaksin, R.; Aung, T. H.; Ratanasawadwat, N.
- Creation and Usability Testing of a Web-Based Pre-Scanning Radiology Patient Safety and History Questionnaire...; 2016; Robinson, T. J.; DuVall, S.; Wiggins III, R.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- Taming Big Data: Using App Technology to Study Organizational Behavior on Social Media; 2015; Bail, C. A.
- A Meta-Analysis of Breakoff Rates in Mobile Web Surveys; 2015; Mavletova, A. M.; Couper, M. P.
- Optimizing the Decennial Census for Mobile – A Case Study; 2015; Nichols, E. M.; Hawala, E. O.; Horwitz, R.; Bentley, M.
- Using Video to Reinvigorate the Open Question; 2015; Cape, P.
- Are Sliders Too Slick for Surveys? An Experiment Comparing Slider and Radio Button Scales for Smartphone...; 2015; Aadland, D.; Aalberg, T.
- Web Surveys Optimized for Smartphones: Are there Differences Between Computer and Smartphone Users?; 2015; Andreadis, I.
- Designing web surveys for the multi-device internet; 2015; de Bruijne, M.
- Data Quality Standards in Mixed Mode Surveys; 2015; Bremer, J.; Barbulescu, M.; Bennett, J.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- A Review of Issues in Gamified Surveys; 2015; Keusch, F.; Zhang, Che.